AMD launches new chips to run large language models in advanced GenAI era

7 Dec 2023 4:24 PM IST

San Francisco, Dec 7: AMD has announced new accelerators and processors for running large language models (LLMs), as graphics chipmaker Nvidia leads the generative AI chip race.

The AMD Instinct MI300X chip offers industry-leading memory bandwidth for generative AI and leadership performance for large language model (LLM) training and inferencing, while the AMD Instinct MI300A accelerated processing unit (APU) combines the latest AMD CDNA 3 architecture with “Zen 4” CPUs to deliver breakthrough performance for HPC and AI workloads.

“AMD Instinct MI300 Series accelerators are designed with our most advanced technologies, delivering leadership performance, and will be in large scale cloud and enterprise deployments,” said Victor Peng, president, AMD.

“By leveraging our leadership hardware, software and open ecosystem approach, cloud providers, OEMs and ODMs are bringing to market technologies that empower enterprises to adopt and deploy AI-powered solutions,” Peng added.

Customers leveraging the latest AMD Instinct accelerator portfolio include Dell Technologies, Hewlett Packard Enterprise, Lenovo, Meta, Microsoft, Oracle, Supermicro and others.

AMD Instinct MI300X delivers nearly 40 per cent more compute units, 1.5x more memory capacity and 1.7x more peak theoretical memory bandwidth than the previous-generation AMD Instinct MI250X, as well as support for new math formats such as FP8 and sparsity, all geared towards AI and HPC workloads.

AMD Instinct MI300X accelerators feature a best-in-class 192GB of HBM3 memory capacity as well as 5.3 TB per second peak memory bandwidth, to deliver the performance needed for increasingly demanding AI workloads, the company said.
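
For a rough sense of what that capacity means in practice, the short Python sketch below gives an illustrative, back-of-the-envelope estimate (an assumption, not a figure from AMD or the article): weight memory is roughly parameter count times bytes per parameter, so the FP16 weights of a hypothetical model in the 70-billion-parameter class fit within the 192GB on a single accelerator.

# Illustrative estimate only: memory needed to hold an LLM's weights.
# Ignores activations, KV cache and framework overhead, which add to the total.
def weight_memory_gb(num_params: float, bytes_per_param: float) -> float:
    """Approximate weight memory in gigabytes."""
    return num_params * bytes_per_param / 1e9

# A hypothetical 70-billion-parameter model stored in FP16 (2 bytes per parameter).
print(f"{weight_memory_gb(70e9, 2):.0f} GB")  # ~140 GB, under the 192GB on one MI300X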

The AMD Instinct Platform is a leadership generative AI platform built on an industry-standard OCP design with eight MI300X accelerators, offering an industry-leading 1.5TB of HBM3 memory capacity.
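
The platform-level figure follows directly from the per-accelerator numbers quoted above; the quick check below is simple aggregation for illustration, not an AMD-published calculation.

# Aggregate the per-accelerator figures quoted in the article across
# the eight MI300X accelerators on the platform (illustrative arithmetic only).
ACCELERATORS = 8
HBM3_PER_MI300X_GB = 192      # memory capacity per MI300X
PEAK_BW_PER_MI300X_TBS = 5.3  # peak memory bandwidth per MI300X, in TB/s

total_capacity_tb = ACCELERATORS * HBM3_PER_MI300X_GB / 1024
total_peak_bw_tbs = ACCELERATORS * PEAK_BW_PER_MI300X_TBS

print(f"Aggregate HBM3 capacity: {total_capacity_tb:.1f} TB")    # 1.5 TB
print(f"Aggregate peak bandwidth: {total_peak_bw_tbs:.1f} TB/s")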
